Unitary Approximate Message Passing for Sparse Bayesian Learning
Authors
Abstract
Sparse Bayesian learning (SBL) can be implemented with low complexity based on the approximate message passing (AMP) algorithm. However, it does not work well for a generic measurement matrix, which may cause AMP to diverge. Damped AMP has been used for SBL to alleviate the problem, at the cost of reducing the convergence speed. In this work, we propose a new SBL algorithm based on structured variational inference, leveraging AMP with a unitary transformation (UAMP). Both single measurement vector and multiple measurement vector problems are investigated. It is shown that, compared to state-of-the-art AMP-based SBL algorithms, the proposed UAMP-SBL is more robust and efficient, leading to remarkably better performance.
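As a rough illustration of the unitary transformation that UAMP-based algorithms build on, the sketch below (an assumption-laden outline, not the authors' implementation) left-multiplies the model y = Ax + w by the conjugate transpose of the left singular matrix from the SVD of A, so that subsequent message-passing iterations operate on the transformed observations; all function and variable names are illustrative.

```python
# Minimal sketch of the unitary (SVD-based) preprocessing assumed above:
# y = A x + w  is transformed into  r = U^H y = Phi x + n  with  Phi = S V^H,
# where A = U S V^H. Message passing would then run on (Phi, r).
import numpy as np

def unitary_transform(A, y):
    """Transform (A, y) into (Phi, r) using the left singular vectors of A."""
    U, s, Vh = np.linalg.svd(A, full_matrices=False)
    Phi = s[:, None] * Vh          # Phi = S V^H
    r = U.conj().T @ y             # r = U^H y
    return Phi, r

# Toy usage: sparse x, generic (ill-conditioned) measurement matrix A.
rng = np.random.default_rng(0)
M, N, K = 60, 100, 10
A = rng.standard_normal((M, N)) @ rng.standard_normal((N, N))
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ x + 0.01 * rng.standard_normal(M)
Phi, r = unitary_transform(A, y)   # AMP/SBL iterations would then use (Phi, r)
```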
Similar References
Bayesian Optimal Approximate Message Passing to Recover Structured Sparse Signals
We present a novel compressed sensing recovery algorithm – termed Bayesian Optimal Structured Signal Approximate Message Passing (BOSSAMP) – that jointly exploits the prior distribution and the structured sparsity of a signal that shall be recovered from noisy linear measurements. Structured sparsity is inherent to group sparse and jointly sparse signals. Our algorithm is based on approximate m...
Approximate Message Passing with Unitary Transformation
Approximate message passing (AMP) and its variants, developed based on loopy belief propagation, are attractive for estimating a vector x from a noisy version of z = Ax, which arises in many applications. For a large A with i.i.d. elements, AMP can be characterized by the state evolution and exhibits fast convergence. However, it has been shown that AMP may easily diverge for a generic A. In...
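For context, the following is a minimal sketch of a generic AMP iteration with soft thresholding for recovering a sparse x from noisy measurements of Ax when A has i.i.d. entries (the regime in which the state-evolution characterization holds); the threshold rule, the parameter alpha, and the function names are illustrative assumptions rather than the algorithm of the cited work.

```python
# Sketch of plain AMP with soft thresholding for  y = A x + w,
# assuming A has i.i.d. zero-mean entries with variance 1/M.
# The Onsager term distinguishes AMP from plain iterative thresholding.
import numpy as np

def soft(u, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def amp(A, y, iters=30, alpha=1.5):
    M, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(iters):
        pseudo = x + A.T @ z                        # pseudo-data estimate
        tau = alpha * np.sqrt(np.mean(z ** 2))      # threshold from residual level
        x_new = soft(pseudo, tau)
        onsager = (z / M) * np.count_nonzero(x_new) # Onsager correction term
        z = y - A @ x_new + onsager
        x = x_new
    return x

# Toy usage with an i.i.d. Gaussian A (variance 1/M) and a K-sparse signal.
rng = np.random.default_rng(1)
M, N, K = 250, 500, 25
A = rng.standard_normal((M, N)) / np.sqrt(M)
x0 = np.zeros(N)
x0[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ x0 + 0.01 * rng.standard_normal(M)
x_hat = amp(A, y)
```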
Swept Approximate Message Passing for Sparse Estimation
Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensionality measurements, both in terms of reconstruction accuracy and in computational efficiency. However, AMP suffers from serious convergence issues in contexts that do not exactly match its assumptions. We propose a new approach to st...
Approximate Message Passing
In this note, I summarize Sections 5.1 and 5.2 of Arian Maleki's PhD thesis. Notation: we denote scalars by small letters, e.g. a, b, c, . . ., vectors by boldface small letters, e.g. λ, α, x, . . ., matrices by boldface capital letters, e.g. A, B, C, . . ., and (subsets of) natural numbers by capital letters, e.g. N, M, . . .. We denote the i'th element of a vector a by ai and the (i, j)'th entry of a matrix A by ...
An Approach to Complex Bayesian-optimal Approximate Message Passing
In this work we aim to solve the compressed sensing problem for the case of a complex unknown vector by utilizing the Bayesian-optimal structured signal approximate message passing (BOSSAMP) algorithm on the jointly sparse real and imaginary parts of the unknown. By introducing a latent activity variable, BOSSAMP separates the tasks of activity detection and value estimation to overcome the pro...
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2021
ISSN: 1053-587X, 1941-0476
DOI: https://doi.org/10.1109/tsp.2021.3114985